A Randomized Polynomial Kernelization for Vertex Cover with a Smaller Parameter
Author
Abstract
In the vertex cover problem we are given a graph G = (V, E) and an integer k and have to determine whether there is a set X ⊆ V of size at most k such that each edge in E has at least one endpoint in X. The problem can be easily solved in time O*(2^k), making it fixed-parameter tractable (FPT) with respect to k. While the fastest known algorithm takes only time O*(1.2738^k), much stronger improvements have been obtained by studying parameters that are smaller than k. Apart from treewidth-related results, the arguably best algorithm for vertex cover runs in time O*(2.3146^p), where p = k − LP(G) is only the excess of the solution size k over the best fractional vertex cover (Lokshtanov et al. TALG 2014). Since p ≤ k but k cannot be bounded in terms of p alone, this strictly increases the range of tractable instances. Recently, Garg and Philip (SODA 2016) greatly contributed to understanding the parameterized complexity of the vertex cover problem. They prove that 2·LP(G) − MM(G) is a lower bound for the vertex cover size of G, where MM(G) is the size of a largest matching of G, and proceed to study the parameter l = k − (2·LP(G) − MM(G)). They give an algorithm with running time O*(3^l), proving that vertex cover is FPT in l. It can be easily observed that l ≤ p, whereas p cannot be bounded in terms of l alone. We complement the work of Garg and Philip by proving that vertex cover admits a randomized polynomial kernelization in terms of l, i.e., an efficient preprocessing to size polynomial in l. This improves over the parameter p = k − LP(G), for which this was previously known (Kratsch and Wahlström FOCS 2012).
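To make the quantities in the abstract concrete, here is a minimal Python sketch (assuming the networkx library; the function names are illustrative): MM(G) is the size of a maximum matching, and LP(G), the optimal fractional vertex cover, can be obtained as half the maximum matching size of the bipartite double cover of G, using half-integrality of the vertex cover LP together with König's theorem. The parameters then follow as p = k − LP(G) and l = k − (2·LP(G) − MM(G)).

import networkx as nx
from networkx.algorithms import bipartite

def matching_number(G):
    # MM(G): size of a maximum matching of G (general graphs).
    return len(nx.max_weight_matching(G, maxcardinality=True))

def lp_value(G):
    # LP(G): optimal fractional vertex cover, computed as half the size of a
    # maximum matching of the bipartite double cover of G.
    B = nx.Graph()
    for u, v in G.edges():
        B.add_edge((u, 0), (v, 1))
        B.add_edge((v, 0), (u, 1))
    left = {n for n in B if n[1] == 0}
    # maximum_matching returns a dict listing both directions of each matched edge
    mm_double_cover = len(bipartite.maximum_matching(B, top_nodes=left)) // 2
    return mm_double_cover / 2.0

def parameters(G, k):
    # Returns (p, l) for an instance (G, k).
    lp, mm = lp_value(G), matching_number(G)
    return k - lp, k - (2 * lp - mm)

# Example: the 5-cycle has MM = 2, LP = 2.5 and minimum vertex cover 3,
# so for k = 3 we get p = 0.5 and l = 0.
print(parameters(nx.cycle_graph(5), 3))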
Similar references
Vertex Cover Kernelization Revisited: Upper and Lower Bounds for a Refined Parameter
Kernelization is a concept that enables the formal mathematical analysis of data reduction through the framework of parameterized complexity. Intensive research into the Vertex Cover problem has shown that there is a preprocessing algorithm which given an instance (G, k) of Vertex Cover outputs an equivalent instance (G′, k′) in polynomial time with the guarantee that G′ has at most 2k′ vertice...
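A hedged sketch of the LP-based (Nemhauser–Trotter style) reduction that underlies such 2k′-vertex guarantees, assuming an optimal half-integral LP solution x with values in {0, 1/2, 1} is already available (for instance via the double-cover computation sketched above), that G is a networkx graph, and that the function name is illustrative:

def nemhauser_trotter_reduce(G, k, x):
    # Nemhauser-Trotter reduction: vertices with LP value 1 can be taken into
    # the cover, vertices with value 0 can be discarded, and the remaining
    # all-1/2 graph has at most 2*k' vertices whenever a cover of the reduced
    # size k' = k - |V1| exists.  Returns (H, k') or None for a no-instance.
    V1 = {v for v in G if x[v] == 1}   # forced into the cover
    V0 = {v for v in G if x[v] == 0}   # safely discarded
    H = G.copy()
    H.remove_nodes_from(V1 | V0)
    k_reduced = k - len(V1)
    if k_reduced < 0 or H.number_of_nodes() > 2 * k_reduced:
        return None                    # trivial no-instance
    return H, k_reduced

Soundness rests on the Nemhauser–Trotter theorem: some minimum vertex cover contains every vertex of LP value 1 and no vertex of value 0, and the all-1/2 remainder has LP value |V(H)|/2, so more than 2k′ remaining vertices certify a no-instance.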
Cross-Composition: A New Technique for Kernelization Lower Bounds
We introduce a new technique for proving kernelization lower bounds, called cross-composition. A classical problem L cross-composes into a parameterized problem Q if an instance of Q with polynomially bounded parameter value can express the logical OR of a sequence of instances of L. Building on work by Bodlaender et al. (ICALP 2008) and using a result by Fortnow and Santhanam (STOC 2008) we sh...
Graph-Modeled Data Clustering
Abstractly speaking, what have we done in the previous section? After applying a number of rules in polynomial time to an instance of VERTEX COVER, we arrived at a reduced instance whose size can solely be expressed in terms of the parameter k. Since this can be easily done in O(n) time, we have found a data reduction for VERTEX COVER with guarantees concerning its running time as well as its effective...
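The rules alluded to here can be illustrated with a short sketch of the classical Buss-style kernelization (assuming networkx; the function name is illustrative and these may not be the exact rules of the cited text): isolated vertices are removed, any vertex of degree greater than k must be in every size-k cover, and afterwards a yes-instance retains at most k^2 edges and k^2 + k vertices.

import networkx as nx

def buss_kernel(G, k):
    # Classical Buss-style reduction for Vertex Cover.  Returns a reduced
    # instance (H, k') or None if (G, k) is recognized as a no-instance.
    H = G.copy()
    changed = True
    while changed:
        changed = False
        for v in list(H.nodes()):
            if H.degree(v) == 0:    # Rule 1: isolated vertices are irrelevant
                H.remove_node(v)
                changed = True
            elif H.degree(v) > k:   # Rule 2: a degree > k vertex is in every size-k cover
                H.remove_node(v)
                k -= 1
                changed = True
        if k < 0:
            return None
    if H.number_of_edges() > k * k: # Rule 3: k vertices of degree <= k cover at most k^2 edges
        return None
    return H, k                     # at most k^2 edges and k^2 + k vertices remain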
Smaller Parameters for Vertex Cover Kernelization
We revisit the topic of polynomial kernels for Vertex Cover relative to structural parameters. Our starting point is a recent paper due to Fomin and Strømme [WG 2016] who gave a kernel with O(|X|^12) vertices when X is a vertex set such that each connected component of G − X contains at most one cycle, i.e., X is a modulator to a pseudoforest. We strongly generalize this re...
Incompressibility through Colors and IDs
In parameterized complexity each problem instance comes with a parameter k, and a parameterized problem is said to admit a polynomial kernel if there are polynomial time preprocessing rules that reduce the input instance to an instance with size polynomial in k. Many problems have been shown to admit polynomial kernels, but it is only recently that a framework for showing the non-existence of p...